basic science
The normalization of (almost) everything: Our minds can get used to anything, and even crises start feeling normal (Science)
For a long time, many climate scientists and advocates held onto an optimistic belief that once the impacts of climate change became undeniable, people and governments would act. But whereas the predictions of climate models have increasingly been borne out, the assumptions about human behavior have not. Even as disasters mount, climate change remains low on voters' priority lists, and policy responses remain tepid. To me, this gap reflects a deeper failure--not just in policy or communication, but in how we understand human adaptability. When I began my career as a computational cognitive scientist, I was drawn to a defining strength of human cognition--a marked ability to adapt.
How to measure the returns on R&D spending
Forget the glorious successes of past breakthroughs--the real justification for research investment is what we get for our money. (MIT Technology Review) Given the draconian cuts to US federal funding for science, including the administration's proposal to reduce the 2026 budgets of the National Institutes of Health by 40% and the National Science Foundation by 57%, it's worth asking some hard-nosed money questions: How much should we be spending on R&D? How much value do we get out of such investments, anyway? To answer that, it's important to look at both successful returns and investments that went nowhere.
How Trump's policies are affecting early-career scientists--in their own words
Every year, we recognize extraordinary young researchers on our Innovators Under 35 list. Recent honorees told us how they're faring under the new administration.
- North America > United States > Texas (0.04)
- North America > United States > Massachusetts (0.04)
- Government > Regional Government > North America Government > United States Government (1.00)
- Banking & Finance > Economy (0.94)
- Health & Medicine (0.89)
Why basic science deserves our boldest investment
The humble inventions that power our modern world wouldn't have been possible without decades of support for early-stage research. In December 1947, three physicists at Bell Telephone Laboratories--John Bardeen, William Shockley, and Walter Brattain--built a compact electronic device using thin gold wires and a piece of germanium, a material known as a semiconductor. Their invention, later named the transistor (for which they were awarded the Nobel Prize in 1956), could amplify and switch electrical signals, marking a dramatic departure from the bulky and fragile vacuum tubes that had powered electronics until then. They were asking fundamental questions about how electrons behave in semiconductors, experimenting with surface states and electron mobility in germanium crystals. Over months of trial and refinement, they combined theoretical insights from quantum mechanics with hands-on experimentation in solid-state physics--work many might have dismissed as too basic, academic, or unprofitable. Their efforts culminated in a moment that now marks the dawn of the information age.
- North America > United States > New York (0.05)
- North America > United States > Massachusetts (0.05)
- Semiconductors & Electronics (0.68)
- Government > Regional Government (0.47)
From Protoscience to Epistemic Monoculture: How Benchmarking Set the Stage for the Deep Learning Revolution
Koch, Bernard J., Peterson, David
Over the past decade, AI research has focused heavily on building ever-larger deep learning models. This approach has simultaneously unlocked incredible achievements in science and technology, and hindered AI from overcoming long-standing limitations with respect to explainability, ethical harms, and environmental efficiency. Drawing on qualitative interviews and computational analyses, our three-part history of AI research traces the creation of this "epistemic monoculture" back to a radical reconceptualization of scientific progress that began in the late 1980s. In the first era of AI research (1950s-late 1980s), researchers and patrons approached AI as a "basic" science that would advance through autonomous exploration and organic assessments of progress (e.g., peer review, theoretical consensus). The failure of this approach led to a retrenchment of funding in the 1980s. Amid this "AI Winter," an intervention by the U.S. government reoriented the field towards measurable progress on tasks of military and commercial interest. A new evaluation system called "benchmarking" provided an objective way to quantify progress on tasks by focusing exclusively on increasing predictive accuracy on example datasets. Distilling science down to verifiable metrics clarified the roles of scientists, allowed the field to rapidly integrate talent, and provided clear signals of significance and progress. But history has also revealed a tradeoff to this streamlined approach to science: the consolidation around external interests and the inherent conservatism of benchmarking have disincentivized exploration beyond the scaling monoculture. In the discussion, we explain how AI's monoculture offers a compelling challenge to the belief that basic, exploration-driven research is needed for scientific progress. Implications for the spread of AI monoculture to other sciences in the era of generative AI are also discussed.
- North America > United States > Massachusetts > Middlesex County > Cambridge (0.14)
- North America > Canada > Ontario > Toronto (0.14)
- North America > United States > Illinois > Cook County > Chicago (0.05)
- (14 more...)
- Leisure & Entertainment (1.00)
- Information Technology (1.00)
- Health & Medicine > Therapeutic Area (1.00)
- (3 more...)
The race of the AI labs heats up
The latest example, judging by the chatter in Silicon Valley, as well as on Wall Street and in corporate corner offices, newsrooms and classrooms around the world, is ChatGPT. In just five days after its unveiling in November the artificially intelligent chatbot, created by a startup called OpenAI, drew 1m users, making it one of the fastest consumer-product launches in history. Microsoft, which has just invested $10bn in OpenAI, wants ChatGPT-like powers, which include generating text, images, music and video that seem like they could have been created by humans, to infuse much of the software it sells. On January 26th Google published a paper describing a similar model that can create new music from a text description of a song. When Alphabet, its parent company, presents quarterly earnings on February 2nd, investors will be listening out for its answer to ChatGPT.
- Asia > China (0.48)
- North America > United States > New York > New York County > New York City (0.25)
- North America > United States > California (0.25)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning > Generative AI (0.60)
The real threat from artificial intelligence (KSU)
What do AI and chloroquine have in common? Readers already grasp the enormous impact artificial intelligence (AI) has on businesses and governments, forcing large economies to draw up strategic plans for the technology. What not everyone yet understands are the real risks the technology poses. A historical overview of artificial intelligence takes us on a roller coaster of exaggerated promises and gigantic disappointments. One of its milestones is the emergence of artificial neural networks (ANN) in 1958, when Frank Rosenblatt invented the "Perceptron".
Is Data Science a science?
At its core, all fundamental science is about making predictions that can be tested by experiment: precise, quantifiable, falsifiable predictions. As Richard P. Feynman put it: "The fundamental principle of science, the definition almost, is this: the sole test of the validity of any idea is experiment." So if science is about making predictions, how is it different from the predictions that astrologers make? The core distinction lies in the kinds of predictions each makes. Most horoscopes, for example, offer only general predictions.
AI, Data Science turn most popular courses at MGU
The growing demand for professionals in the areas of Artificial Intelligence and Data Science is reflected in the new-generation interdisciplinary programmes proposed by government and aided colleges under Mahatma Gandhi University (MGU) for the academic year 2020-21. The university had invited applications from higher educational institutions based on the government directive that such courses could be launched from November 1. The Higher Education Department had asked universities to initiate steps to launch new undergraduate and postgraduate programmes in innovative areas. They included four- and five-year programmes recommended by an expert committee set up by the government. The integrated M.Sc programme in Computer Science (Artificial Intelligence and Machine Learning) figured at the top of the innovative programmes proposed by the affiliated colleges.
Digital economy and AI high on the minds of China's tech leaders
Chinese tech leaders articulated their vision for the post-internet future at the World Internet Conference held this past week, with artificial intelligence, complete digitisation of the economy and a call for more basic science among the topics of discussion – though it was mostly a local affair given the absence of high-profile US tech company representatives amid the ongoing US-China trade war. "Artificial intelligence and the internet represent two different eras," said Baidu CEO Robin Li Yanhong on Thursday. "We will step into the AI era in the coming three to five decades, while the previous 20 years belonged to the internet." Baidu, which operates China's largest internet search engine, is a so-called AI national champion whose efforts in the field are endorsed by the central government. It was also the first Chinese company to join an international AI ethics group set up last month, alongside members such as Apple and Alphabet's Google.
- Government (1.00)
- Information Technology > Services (0.31)
Toyota Invests $1 Billion in Artificial Intelligence in U.S.
The new effort by Toyota is also the latest indication of a changing of the guard in Silicon Valley's basic technology research. In September, when Dr. Pratt joined Toyota, the company announced an initial artificial intelligence research effort committing $50 million in funding to the computer science departments of both Stanford and M.I.T. In addition to focusing on navigation technologies, the new research corporation will also apply artificial intelligence technologies to Toyota's factory automation systems, Dr. Pratt said. A version of this article appears in print on November 6, 2015, on page B3 of the New York edition with the headline: Toyota Planning an Artificial Intelligence Research Center in California.
- Information Technology (1.00)
- Automobiles & Trucks > Manufacturer (1.00)